Information Theory of Penalized Likelihoods and its Statistical Implications
Authors
Abstract
We extend the correspondence between two-stage coding procedures in data compression and penalized likelihood procedures in statistical estimation. Traditionally, this correspondence has required restriction to countable parameter spaces; we show how to extend it to the uncountable case. Leveraging the description-length interpretation of penalized likelihood procedures, we devise new techniques for deriving adaptive risk bounds for such procedures. We show that the existence of certain countable coverings of the parameter space implies adaptive risk bounds, so our theory is quite general. We apply our techniques to obtain risk bounds for ℓ1-type penalized procedures in canonical high-dimensional statistical problems such as linear regression and Gaussian graphical models. In the linear regression problem, we also show that the traditional ℓ0 penalty times (log n)/2, plus lower-order terms, has a two-stage description-length interpretation, and we present risk bounds for this penalized likelihood procedure.
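As a concrete illustration of the ℓ1-type penalized likelihood procedures discussed above, the following sketch minimizes a penalized criterion of the form (negative log likelihood) + pen(b) in the Gaussian linear regression setting, where the negative log likelihood is proportional to the squared residual and pen(b) = λ‖b‖₁ (the lasso). The coordinate-descent solver and all names here are illustrative, not taken from the paper.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the proximal map of the l1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def l1_penalized_ls(X, y, lam, n_iter=200):
    """Minimize 0.5*||y - X b||^2 + lam*||b||_1 by cyclic coordinate descent.

    Under Gaussian noise, 0.5*||y - X b||^2 is (up to constants) the
    negative log likelihood, so this is a penalized likelihood estimator.
    """
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)  # per-coordinate curvature ||x_j||^2
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with coordinate j removed
            r_j = y - X @ b + X[:, j] * b[j]
            b[j] = soft_threshold(X[:, j] @ r_j, lam) / col_sq[j]
    return b

# Synthetic sparse-regression example (illustrative data, not from the paper)
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
beta = np.zeros(10)
beta[:3] = [3.0, -2.0, 1.5]  # sparse ground truth
y = X @ beta + 0.1 * rng.standard_normal(100)
b_hat = l1_penalized_ls(X, y, lam=5.0)
print(np.round(b_hat, 2))
```

With a suitable λ the estimate recovers the sparse support, which is the behavior the adaptive risk bounds in the abstract quantify.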
Similar Papers
IFRS or IFRS-Based Domestic Standards: Implications for China’s Future Accounting System
The People's Republic of China has a long history of accounting and accounting reforms. This study focuses on whether China should continue with its IFRS-based domestic accounting standards or whether full convergence with IFRS is more appropriate. Both quantitative and qualitative approaches are applied to answer the research question of this work. A binary choice model has been used in the statistical ana...
Full Text
Statistical models: Conventional, penalized and hierarchical likelihood
We give an overview of statistical models and likelihood, together with two of its variants: penalized and hierarchical likelihood. The Kullback-Leibler divergence is referred to repeatedly in the literature, for defining the misspecification risk of a model and for grounding the likelihood and the likelihood cross-validation, which can be used for choosing weights in penalized likelihood. Fami...
Full Text
The MDL Principle, Penalized Likelihoods, and Statistical Risk
We determine, for both countable and uncountable collections of functions, information-theoretic conditions on a penalty pen(f) such that the optimizer f̂ of the penalized log likelihood criterion log 1/likelihood(f) + pen(f) has statistical risk not more than the index of resolvability corresponding to the accuracy of the optimizer of the expected value of the criterion. If F is the linear span ...
Full Text
Statistical radar imaging of diffuse and specular targets using an expectation-maximization algorithm
Radar imaging is often posed as a problem of estimating deterministic reflectances observed through a linear mapping and additive Gaussian receiver noise. We consider an alternative view in which the reflectances themselves are a realization of a random process; imaging then involves estimating the parameters of that underlying process. Purely diffuse radar targets are modeled by a zer...
Full Text
MDL Procedures with ℓ1 Penalty and their Statistical Risk, Updated August 15, 2008, Andrew
We review recently developed theory for the Minimum Description Length principle, penalized likelihood and its statistical risk. An information theoretic condition on a penalty pen(f) yields the conclusion that the optimizer of the penalized log likelihood criterion log 1/likelihood(f) + pen(f) has risk not more than the index of resolvability, corresponding to the accuracy of the optimizer of ...
Full Text